C3_W3_A1_Assignment gives error at cell 15

Hi everyone,

Sorry for the repost, but adding my issue to an old topic did not get any reply.
I am working on the lab id sykympjyaiey.

I did the exercises and they passed the tests, but running cell 15 produces the error below and I have no clue how to fix it.
Any help would be great, thanks a lot in advance.


NotImplementedError Traceback (most recent call last)
44 # Set the y targets, perform a gradient descent step,
45 # and update the network weights.
---> 46 agent_learn(experiences, GAMMA)
48 state = next_state.copy()

/opt/conda/lib/python3.7/site-packages/tensorflow_core/python/eager/def_function.py in __call__(self, *args, **kwds)
566 xla_context.Exit()
567 else:
---> 568 result = self._call(*args, **kwds)
570 if tracing_count == self._get_tracing_count():

/opt/conda/lib/python3.7/site-packages/tensorflow_core/python/eager/def_function.py in _call(self, *args, **kwds)
613 # This is the first call of __call__, so we have to initialize.
614 initializers =
---> 615 self._initialize(args, kwds, add_initializers_to=initializers)
616 finally:
617 # At this point we know that the initialization is complete (or less

/opt/conda/lib/python3.7/site-packages/tensorflow_core/python/eager/def_function.py in _initialize(self, args, kwds, add_initializers_to)
495 self._concrete_stateful_fn = (
496 self._stateful_fn._get_concrete_function_internal_garbage_collected( # pylint: disable=protected-access
---> 497 *args, **kwds))
499 def invalid_creator_scope(*unused_args, **unused_kwds):

/opt/conda/lib/python3.7/site-packages/tensorflow_core/python/eager/function.py in _get_concrete_function_internal_garbage_collected(self, *args, **kwargs)
2387 args, kwargs = None, None
2388 with self._lock:
---> 2389 graph_function, _, _ = self._maybe_define_function(args, kwargs)
2390 return graph_function

/opt/conda/lib/python3.7/site-packages/tensorflow_core/python/eager/function.py in _maybe_define_function(self, args, kwargs)
2702 self._function_cache.missed.add(call_context_key)
---> 2703 graph_function = self._create_graph_function(args, kwargs)
2704 self._function_cache.primary[cache_key] = graph_function
2705 return graph_function, args, kwargs

/opt/conda/lib/python3.7/site-packages/tensorflow_core/python/eager/function.py in _create_graph_function(self, args, kwargs, override_flat_arg_shapes)
2591 arg_names=arg_names,
2592 override_flat_arg_shapes=override_flat_arg_shapes,
---> 2593 capture_by_value=self._capture_by_value),
2594 self._function_attributes,
2595 # Tell the ConcreteFunction to clean up its graph once it goes out of

/opt/conda/lib/python3.7/site-packages/tensorflow_core/python/framework/func_graph.py in func_graph_from_py_func(name, python_func, args, kwargs, signature, func_graph, autograph, autograph_options, add_control_dependencies, arg_names, op_return_value, collections, capture_by_value, override_flat_arg_shapes)
976 converted_func)
---> 978 func_outputs = python_func(*func_args, **func_kwargs)
980 # invariant: func_outputs contains only Tensors, CompositeTensors,

/opt/conda/lib/python3.7/site-packages/tensorflow_core/python/eager/def_function.py in wrapped_fn(*args, **kwds)
437 # wrapped allows AutoGraph to swap in a converted function. We give
438 # the function a weak reference to itself to avoid a reference cycle.
---> 439 return weak_wrapped_fn().wrapped(*args, **kwds)
440 weak_wrapped_fn = weakref.ref(wrapped_fn)

/opt/conda/lib/python3.7/site-packages/tensorflow_core/python/framework/func_graph.py in wrapper(*args, **kwargs)
966 except Exception as e: # pylint:disable=broad-except
967 if hasattr(e, "ag_error_metadata"):
---> 968 raise e.ag_error_metadata.to_exception(e)
969 else:
970 raise

NotImplementedError: in converted code:

<ipython-input-14-ebdf0fb43251>:14 agent_learn  *
    loss = compute_loss(experiences, gamma, q_network, target_q_network)
<ipython-input-12-16a407a82a4a>:37 compute_loss  *
<__array_function__ internals>:6 dot
/opt/conda/lib/python3.7/site-packages/tensorflow_core/python/framework/ops.py:728 __array__
    " array.".format(self.name))

NotImplementedError: Cannot convert a symbolic Tensor (sub_2:0) to a numpy array.

Your error is telling you that the symbolic tensor sub_2:0 cannot be converted to a NumPy array; the subtraction that produces sub_2:0 is what triggers the error.
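In general, this error appears when a NumPy operation is applied to a symbolic TensorFlow tensor inside a @tf.function. A minimal sketch of the failure mode (hypothetical code, not from the assignment):

```python
import numpy as np
import tensorflow as tf

@tf.function
def graph_fn(a, b):
    diff = a - b  # a symbolic Tensor during tracing (named something like "sub:0")
    # np.dot tries to convert the symbolic tensor to a numpy array,
    # which raises NotImplementedError while the graph is being traced
    return np.dot(diff, diff)

try:
    graph_fn(tf.constant([1.0, 2.0]), tf.constant([0.0, 1.0]))
except NotImplementedError as err:
    print("NotImplementedError:", err)
```

Eager tensors can be converted to NumPy arrays, but graph-mode tensors inside a @tf.function cannot, which is why the failure only shows up once agent_learn() runs.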

Thanks a lot for your fast reply.
Unfortunately I have no clue how to fix this. Is it because my implementation is wrong? That would surprise me, as it passed all the tests.

Did anyone have the same issue?

Thanks in advance for any help.


Are you running the notebook locally, or on Coursera Labs?

Can you please share the error in a screenshot, without sharing any code?

Thanks for your reply.
I am using the Coursera lab.


Thank you very much for your reply. The requested error message is already in my original post, just above the converted code.
Unfortunately there is nothing more to add.

kind regards


The reason I asked is that your original post shows a metadata error, and the way it was posted makes it hard to read.


I reviewed your error log in this thread in some detail.

The first thing that caught my attention is this line of code from your compute_loss() function:

That should be loss = MSE(...), not np.dot(…). MSE is the mean squared error loss function from Keras.
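For anyone else landing here, the shape of the fix looks roughly like this (a sketch with placeholder names, not the assignment solution; y_targets and q_values stand in for the tensors computed inside compute_loss()):

```python
import tensorflow as tf
from tensorflow.keras.losses import MSE

@tf.function
def graph_loss(y_targets, q_values):
    # MSE operates directly on symbolic tensors inside the graph,
    # whereas np.dot would try (and fail) to convert them to numpy arrays
    return MSE(y_targets, q_values)

loss = graph_loss(tf.constant([1.0, 2.0]), tf.constant([1.0, 3.0]))
print(float(loss))  # mean((1-1)^2, (2-3)^2) = 0.5
```

Because MSE is built from TensorFlow ops, it traces cleanly inside the @tf.function that wraps agent_learn().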

Per the instructions in the notebook just above that cell:


Many thanks, I'll have a look.


Many, many thanks TMosh.

I apologize for my mistake. I was happy when all the tests passed, but you found the bug anyway.

kind regards
