Happy Model function

I am using Keras for the first time. My assignment code is “nfjvjprt”. I have written the following code for the Week 1 assignment.

{moderator edit - solution code removed}

I am getting the following error
Test failed
Expected value

['ReLU', (None, 64, 64, 32), 0]

does not match the input value:

['Activation', (None, 64, 64, 32), 0]

AssertionError Traceback (most recent call last)
in
     12 ['Dense', (None, 1), 32769, 'sigmoid']]
     13
---> 14 comparator(summary(happy_model), output)

~/work/release/W1A2/test_utils.py in comparator(learner, instructor)
     20 "\n\n does not match the input value: \n\n",
     21 colored(f"{a}", "red"))
---> 22 raise AssertionError("Error in test")
     23 print(colored("All tests passed!", "green"))
     24

AssertionError: Error in test
Any help to understand and debug this would be really appreciated.

You have used the “generic” Activation layer and passed it “relu” as an argument. Try using the explicit ReLU() layer. This was described in the instructions for this section.
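To see the difference concretely, here is a minimal sketch (not the assignment solution): both layers compute the same function, but the comparator checks the layer's class name, and only the explicit layer is reported as "ReLU".

```python
import tensorflow as tf
import tensorflow.keras.layers as tfl

# Generic Activation layer wrapping 'relu' -- its class name is "Activation"
generic = tfl.Activation('relu')
# Explicit ReLU layer -- its class name is "ReLU", which is what the test expects
explicit = tfl.ReLU()

x = tf.constant([[-1.0, 0.0, 2.0]])
# Both compute exactly the same values...
print(generic(x).numpy())   # [[0. 0. 2.]]
print(explicit(x).numpy())  # [[0. 0. 2.]]
# ...but the class names differ, and that is what comparator() sees
print(type(generic).__name__, type(explicit).__name__)  # Activation ReLU
```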

Also just as a “style point” here: the error message in this case is pretty clear, once you actually read it closely enough to figure out what is just boilerplate and what is the relevant part. Learning how to parse the error messages can save you quite a bit of time. You can help yourself, instead of waiting 8 hours for someone to see your question on the forum.

It worked using ReLU.
Unfortunately, from the error I could not figure out that I had to use the ReLU layer rather than the generic Activation layer with the 'relu' argument. How did you conclude from the error that ReLU was needed? Also, what is the difference between those two functions?

In the same notebook, writing the function nfjvjprt, I am getting an error. Though I am trying to read the error, I have no clue what it is saying. Can you please help me read it? I am stuck here. I can write the code here, but maybe that is not wanted.
def convolutional_model(input_shape):

The error reads as follows:
OperatorNotAllowedInGraphError Traceback (most recent call last)
in
----> 1 conv_model = convolutional_model((64, 64, 3))
      2 conv_model.compile(optimizer='adam',
      3 loss='categorical_crossentropy',
      4 metrics=['accuracy'])
      5 conv_model.summary()

in convolutional_model(input_shape)
     39
     40 Z1 = tfl.Conv2D(8, (4,4), strides=(1,1), padding='SAME')(input_img)
---> 41 A1 = tfl.ReLU(Z1)
     42 P1 = tfl.MaxPool2D(pool_size=(8,8), strides=(8,8), padding='SAME')
     43 #P1 = tfl.MaxPool2D((8,8), strides=(2, 2), padding='same'):

/opt/conda/lib/python3.7/site-packages/tensorflow/python/keras/layers/advanced_activations.py in __init__(self, max_value, negative_slope, threshold, **kwargs)
    344 def __init__(self, max_value=None, negative_slope=0, threshold=0, **kwargs):
    345 super(ReLU, self).__init__(**kwargs)
--> 346 if max_value is not None and max_value < 0.:
    347 raise ValueError('max_value of Relu layer '
    348 'cannot be negative value: ' + str(max_value))

/opt/conda/lib/python3.7/site-packages/tensorflow/python/framework/ops.py in __bool__(self)
    875 TypeError.
    876 """
--> 877 self._disallow_bool_casting()
    878
    879 def __nonzero__(self):

/opt/conda/lib/python3.7/site-packages/tensorflow/python/framework/ops.py in _disallow_bool_casting(self)
    488 else:
    489 # Default: V1-style Graph execution.
--> 490 self._disallow_in_graph_mode("using a tf.Tensor as a Python bool")
    491
    492 def _disallow_iteration(self):

/opt/conda/lib/python3.7/site-packages/tensorflow/python/framework/ops.py in _disallow_in_graph_mode(self, task)
    477 raise errors.OperatorNotAllowedInGraphError(
    478 "{} is not allowed in Graph execution. Use Eager execution or decorate"
--> 479 " this function with @tf.function.".format(task))
    480
    481 def _disallow_bool_casting(self):

OperatorNotAllowedInGraphError: using a tf.Tensor as a Python bool is not allowed in Graph execution. Use Eager execution or decorate this function with @tf.function.

That’s the part of the error message that tells you that it is looking for “ReLU” as a function name, but you gave it “Activation”. Your code would actually work, I think (has the same effect), but the test was written to check specifically for that function name. It is frequently the case in programming that there can be more than one way to write the code that works. Normally they only judge by the output of your code, but unfortunately in this case they are checking for the actual function name.

For the new error, about OperatorNotAllowedInGraph, that’s harder to interpret. What it’s telling you is that you did not call the ReLU layer correctly. In this second part of the exercise, we’re doing the Functional API, not the Sequential API. Here’s a thread that is about the same mistake that you made. Please have a look and that should explain it.
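For reference, here is a sketch of the Functional-API pattern, using the same layer parameters as the traceback above. The key point is that the layer object is constructed first (with its own arguments, or none), and only then called on the tensor.

```python
import tensorflow as tf
import tensorflow.keras.layers as tfl

input_img = tf.keras.Input(shape=(64, 64, 3))
Z1 = tfl.Conv2D(8, (4, 4), strides=(1, 1), padding='same')(input_img)
# Wrong: tfl.ReLU(Z1) passes the tensor to the constructor as max_value,
# so the "max_value < 0." check tries to cast a symbolic tensor to a
# Python bool, which raises OperatorNotAllowedInGraphError.
# Right: construct the layer with (), then call it on the tensor:
A1 = tfl.ReLU()(Z1)
P1 = tfl.MaxPool2D(pool_size=(8, 8), strides=(8, 8), padding='same')(A1)
model = tf.keras.Model(inputs=input_img, outputs=P1)
print(model.output_shape)  # (None, 8, 8, 8)
```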

Thanks for the suggestion, I passed all the tests.

When I submitted my assignment code “nfjvjprt” with all the tests passed and everything working, the grader output is zero and it says failed. Can you please let me know what I am doing wrong?

Sorry, but no-one else can see your notebook or your scores. Please show us the grader output. Not just the score but the “Show grader output” results.


Thanks! I just checked now and it has updated: 100/100.

That’s good news! One thing that may have happened is that the “Submit” mechanism here in Course 4 does not do an automatic “Save” for you when you click submit. So what the grader sees is whatever the notebook looked like the last time you clicked “Save”. So even though the code you are looking at is correct, that may not be what the grader sees. In other words, maybe what changed between the failed submit and the successful submit is just that you clicked “Save” to make things consistent.

Also note that all that huge spew of stuff that sounds like errors down to and including the “Timeout (30s)” and “Interrupting Kernel” is just normal stuff that the grader says in most of the Course 4 assignments. That’s not the problem. We asked the Course Staff and apparently that’s just a “Coursera Thing ™” and there’s no way to suppress those confusing and irrelevant messages. Sigh …