Week 2, Assignment 2, alpaca_model: tensor type issue?

Week 2: Transfer Learning with MobileNet

I am having problems with alpaca_model (Exercise 2). If I understand correctly, to get outputs I should use the decode_predictions() function. In the example above the exercise it is called as decoded_predictions = decode_predictions(pred.numpy(), top=2), so the first thing I tried was calling it with x.numpy() as the input.

That results in:

'Tensor' object has no attribute 'numpy'

I then tried converting x to a numpy array with np.array(x), which gave:

NotImplementedError: Cannot convert a symbolic Tensor (dropout_30/cond/Identity:0) to a numpy array. This error may indicate that you're trying to pass a Tensor to a NumPy call, which is not supported

When I checked the example, I noticed that the pred object was of type <class 'tensorflow.python.framework.ops.EagerTensor'>. By contrast, x in alpaca_model is <class 'tensorflow.python.framework.ops.Tensor'>.

I can’t figure out whether my last step is wrong or my input to it is. Should my tensor object be an EagerTensor as well?
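
For reference, here is a minimal comparison of the two types I’m seeing (a sketch outside the notebook, so the layer and data here are made up):

import tensorflow as tf

# Eager execution: calling a layer on concrete data returns an EagerTensor,
# which holds real values and has a .numpy() method.
image_batch = tf.random.uniform((1, 160, 160, 3))
eager_out = tf.keras.layers.GlobalAveragePooling2D()(image_batch)
print(type(eager_out))          # <class '...ops.EagerTensor'>
print(eager_out.numpy().shape)  # (1, 3) -- .numpy() works here

# Functional-API model construction: tensors are symbolic placeholders
# with no values yet, so .numpy() is unavailable.
inputs = tf.keras.Input(shape=(160, 160, 3))
x = tf.keras.layers.GlobalAveragePooling2D()(inputs)
print(type(x))                  # <class '...ops.Tensor'> (KerasTensor in newer TF)
# x.numpy()  # AttributeError: 'Tensor' object has no attribute 'numpy'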

PS: My runner-up theory for what’s wrong here is the data augmentation step. I am running that as x = data_augmentation(tf.expand_dims(inputs[-1], 0)). That is the correct subsetting, right?

If you are talking about the call to data_augmentation in your alpaca_model code, why do you need any slicing followed by dimension expansion? Just feed the inputs tensor to the function. The inputs will be a batch of images, so it already has the full dimensions of samples x h x w x c, right?
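
In sketch form (not the full assignment code; the experimental preprocessing namespace is what this notebook’s TF version uses):

import tensorflow as tf

data_augmentation = tf.keras.Sequential([
    tf.keras.layers.experimental.preprocessing.RandomFlip('horizontal'),
])

inputs = tf.keras.Input(shape=(160, 160, 3))  # already batched: (None, 160, 160, 3)
x = data_augmentation(inputs)                 # consumes the whole batch directly
print(x.shape)                                # (None, 160, 160, 3)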

I did not find any need to call decode_predictions in the later logic of that notebook. Note that the output of the new model we are building here is no longer a multiclass output: it’s a binary classification of “alpaca” or “not alpaca”, right?
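
For example, something like this is all the “decoding” a one-logit binary output ever needs (made-up numbers):

import tensorflow as tf

logits = tf.constant([[1.3], [-0.7]])      # hypothetical model outputs, one logit each
probs = tf.sigmoid(logits)                 # P(alpaca)
labels = tf.cast(probs > 0.5, tf.int32)    # 1 = alpaca, 0 = not alpaca
print(probs.numpy().ravel(), labels.numpy().ravel())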


Well, the call to data_augmentation was indeed part of the problem. I called it that way because the example uses augmented_image = data_augmentation(tf.expand_dims(first_image, 0)).
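
I see now that the example only needs expand_dims because first_image is a single image with no batch axis. A quick check (made-up data, same experimental preprocessing layers as the notebook):

import tensorflow as tf

data_augmentation = tf.keras.Sequential([
    tf.keras.layers.experimental.preprocessing.RandomFlip('horizontal'),
])

first_image = tf.random.uniform((160, 160, 3))                  # one image, no batch axis
augmented = data_augmentation(tf.expand_dims(first_image, 0))   # add the batch axis
print(augmented.shape)                                          # (1, 160, 160, 3)

# Inside alpaca_model, inputs = tf.keras.Input(...) already has shape
# (None, 160, 160, 3), so no expand_dims is needed there.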

Just passing the inputs tensor in runs without error, but there are still issues. I took the hint that decode_predictions is not what I want and tried calling base_model again instead. If that wasn’t what I wanted either, I am really confused.

Calling base_model at the last step results in:

WARNING:tensorflow:Model was constructed with shape (None, 160, 160, 3) for input Tensor("input_52:0", shape=(None, 160, 160, 3), dtype=float32), but it was called on an input with incompatible shape (None, 2, 2, 1280).

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-68-11ffc7a7acb3> in <module>
----> 1 model2 = alpaca_model(IMG_SIZE, data_augmentation)

<ipython-input-67-71e36ff5ca6d> in alpaca_model(image_shape, data_augmentation)
     49 
     50     # use a prediction layer with one neuron (as a binary classifier only needs one)
---> 51     outputs = base_model(x)
     52 #     print("outputs are", outputs)
     53 

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in __call__(self, *args, **kwargs)
    924     if _in_functional_construction_mode(self, inputs, args, kwargs, input_list):
    925       return self._functional_construction_call(inputs, args, kwargs,
--> 926                                                 input_list)
    927 
    928     # Maintains info about the `Layer.call` stack.

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in _functional_construction_call(self, inputs, args, kwargs, input_list)
   1115           try:
   1116             with ops.enable_auto_cast_variables(self._compute_dtype_object):
-> 1117               outputs = call_fn(cast_inputs, *args, **kwargs)
   1118 
   1119           except errors.OperatorNotAllowedInGraphError as e:

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/functional.py in call(self, inputs, training, mask)
    384     """
    385     return self._run_internal_graph(
--> 386         inputs, training=training, mask=mask)
    387 
    388   def compute_output_shape(self, input_shape):

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/functional.py in _run_internal_graph(self, inputs, training, mask)
    506 
    507         args, kwargs = node.map_arguments(tensor_dict)
--> 508         outputs = node.layer(*args, **kwargs)
    509 
    510         # Update tensor_dict.

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in __call__(self, *args, **kwargs)
    924     if _in_functional_construction_mode(self, inputs, args, kwargs, input_list):
    925       return self._functional_construction_call(inputs, args, kwargs,
--> 926                                                 input_list)
    927 
    928     # Maintains info about the `Layer.call` stack.

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/base_layer.py in _functional_construction_call(self, inputs, args, kwargs, input_list)
   1090       # TODO(reedwm): We should assert input compatibility after the inputs
   1091       # are casted, not before.
-> 1092       input_spec.assert_input_compatibility(self.input_spec, inputs, self.name)
   1093       graph = backend.get_graph()
   1094       # Use `self._name_scope()` to avoid auto-incrementing the name.

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/engine/input_spec.py in assert_input_compatibility(input_spec, inputs, layer_name)
    214                 ' incompatible with the layer: expected axis ' + str(axis) +
    215                 ' of input shape to have value ' + str(value) +
--> 216                 ' but received input with shape ' + str(shape))
    217     # Check shape.
    218     if spec.shape is not None:

ValueError: Input 0 of layer Conv1 is incompatible with the layer: expected axis -1 of input shape to have value 3 but received input with shape [None, 3, 3, 1280]


Edit: Oh wait, I should be using Dense for the output layer, right?
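
If so, the error message actually says as much: after the pooling steps, x has 1280 channels, so it no longer looks like a 3-channel image, and base_model’s first conv layer (Conv1) rejects it. A rough reproduction of the idea (weights=None is just a shortcut to skip the download):

import tensorflow as tf

base_model = tf.keras.applications.MobileNetV2(input_shape=(160, 160, 3),
                                               include_top=False,
                                               weights=None)
inputs = tf.keras.Input(shape=(160, 160, 3))
feats = base_model(inputs)   # (None, 5, 5, 1280): feature maps, not a 3-channel image
# base_model(feats)          # -> ValueError: Conv1 expects axis -1 to have value 3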

Ok so yeah, it seems Dense was the thing to do. Now at least I can get to the end of that cell, but the test gives me:

Test failed 
 Expected value 

 ['GlobalAveragePooling2D', (None, 1280), 0] 

 does not match the input value: 

 ['AveragePooling2D', (None, 2, 2, 1280), 0]

---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-98-0346cb4bf847> in <module>
     10                     ['Dense', (None, 1), 1281, 'linear']] #linear is the default activation
     11 
---> 12 comparator(summary(model2), alpaca_summary)
     13 
     14 for layer in summary(model2):

~/work/W2A2/test_utils.py in comparator(learner, instructor)
     21                   "\n\n does not match the input value: \n\n",
     22                   colored(f"{a}", "red"))
---> 23             raise AssertionError("Error in test")
     24     print(colored("All tests passed!", "green"))
     25 

AssertionError: Error in test

Oh wait, I see what I did wrong, got it now, thanks for the help!
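
For anyone else who hits this: I had AveragePooling2D where the model needs GlobalAveragePooling2D, exactly as the comparator output said. A quick shape check on made-up feature maps shows the difference:

import tensorflow as tf

feats = tf.random.uniform((1, 5, 5, 1280))  # stand-in for MobileNetV2 feature maps
print(tf.keras.layers.AveragePooling2D()(feats).shape)
# (1, 2, 2, 1280): pools 2x2 windows but keeps the spatial axes
print(tf.keras.layers.GlobalAveragePooling2D()(feats).shape)
# (1, 1280): averages each whole feature map, matching the expected summary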

Right, the first rule of debugging is “Believe the error message!” If you don’t understand what it’s telling you, that’s the first problem you need to solve. Glad to hear that you got it all sorted out. :nerd_face: