DLS Course 2, Week 3, Exercise 6: compute_cost

In the compute_cost function I used
cost = tf.reduce_mean(tf.keras.losses.categorical_crossentropy(labels, logits))

but I am getting an error. Can someone help me find where I made a mistake?

9 Likes

Hello @sandra_jayan,

Please notice the shape of the logits and labels arguments as received by the compute_cost function versus the shape expected by tf.keras.losses.categorical_crossentropy; you may need to adjust that. Also be aware of the from_logits parameter needed in the call to categorical_crossentropy.
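For instance, here is a toy illustration of the layout mismatch (made-up shapes, not the assignment tensors): compute_cost receives tensors laid out as (num_classes, num_examples), while the loss works along the last axis and expects (num_examples, num_classes).

import tensorflow as tf

# Toy tensors only, for illustration: 6 classes, 2 examples,
# laid out as (num_classes, num_examples) the way compute_cost receives them.
logits = tf.random.normal((6, 2))
labels = tf.one_hot([0, 3], depth=6, axis=0)   # also shape (6, 2)

print(logits.shape, labels.shape)              # (6, 2) (6, 2)
# tf.keras.losses.categorical_crossentropy computes the loss along the
# last axis, so it expects (num_examples, num_classes) instead.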

11 Likes

@kampamocha
I have tried reshaping it, but it didn't work at all.
I have even tried setting from_logits=True.

logits = tf.cast(tf.reshape(logits,(logits.shape[1],logits.shape[0])),dtype=tf.float64)
labels = tf.cast(tf.reshape(labels,(labels.shape[1],labels.shape[0])),dtype=tf.float64)
cost = tf.reduce_mean(tf.keras.losses.categorical_crossentropy( y_true=labels, y_pred=logits))

1 Like

Hi @muhammadahmad,

According to the compute_cost function's documentation, you are receiving arguments with shape (num_classes, num_examples), but you need to pass tensors of shape (num_examples, num_classes) to tf.keras.losses.categorical_crossentropy.

Notice that reshape gives you the correct dimensions but does not transpose rows and columns, which is what you need in this case. You should try another TensorFlow function designed specifically for that operation.
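For example, with a made-up 2x3 tensor (not the assignment data), reshape keeps the original element order while transpose actually swaps rows and columns:

import tensorflow as tf

x = tf.constant([[1, 2, 3],
                 [4, 5, 6]])        # shape (2, 3)

print(tf.reshape(x, (3, 2)))
# [[1 2]
#  [3 4]
#  [5 6]]  <- same row-major order, the pairs get scrambled

print(tf.transpose(x))
# [[1 4]
#  [2 5]
#  [3 6]]  <- rows and columns genuinely swapped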

Hope that helps.

17 Likes

@kampamocha

Thanks, it worked after taking the transpose.

But the expected output mentioned in the assignment does not seem to be correct.

8 Likes

Thanks a lot, it worked after adding from_logits. Also, I had forgotten to take the transpose in the code above.

All the exercises passed, but I only got 80. I couldn't find where I went wrong.

3 Likes

Also, be careful about the order of the categorical_crossentropy parameters. The first should be the true labels; the predictions come second. I spent 2 desperate hours before I noticed this :slight_smile:
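A quick sanity check with made-up values shows why this is easy to miss: swapping the arguments does not raise an error, it just silently gives a different number.

import tensorflow as tf

y_true = tf.constant([[1., 0., 0.]])       # one-hot true label
y_pred = tf.constant([[2.0, 1.0, 0.1]])    # raw logits

# Correct order: true labels first, predictions second.
print(tf.keras.losses.categorical_crossentropy(y_true, y_pred, from_logits=True))

# Swapped order: no error is raised, but the value is wrong.
print(tf.keras.losses.categorical_crossentropy(y_pred, y_true, from_logits=True))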

19 Likes

What does adding "from_logits = True" mean here? I know that it works, but I'm not sure why. Thank you for your help!

Hi @Nanyin,

from_logits=True indicates that y_pred is not normalized (i.e. does not come from a softmax).
If you keep the default option from_logits=False, the function assumes that y_pred already comes as a probability distribution.
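As a small made-up example, passing raw scores with from_logits=True gives the same loss as applying softmax yourself and leaving the default from_logits=False:

import tensorflow as tf

y_true = tf.constant([[0., 1., 0.]])
raw_scores = tf.constant([[0.5, 2.0, -1.0]])   # unnormalized logits

loss_a = tf.keras.losses.categorical_crossentropy(
    y_true, raw_scores, from_logits=True)
loss_b = tf.keras.losses.categorical_crossentropy(
    y_true, tf.nn.softmax(raw_scores), from_logits=False)

print(loss_a, loss_b)   # same value, up to numerical precision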

Check the link in this post for the documentation of the categorical_crossentropy function.

12 Likes

My god, thank you. I'd been going crazy for about half an hour because my output was ridiculously larger than 100 and I was just wondering why… Really helpful :dizzy_face:

2 Likes

Thanks very much.
I added from_logits=True and also transposed the two input parameters, labels and logits, first, and it passed.
Thanks,

7 Likes

I have used all of the recommendations - transposing, argument order, from_logits=True, taking the cross-entropy and then reduce_mean - but I am getting the strange error shown below. Any suggestions?

FIXED: I had run all of the cells above one by one while working on the assignment, but then used "Run All Above" from the Cell menu while in the grading cell to run them all again, and that fixed the problem.


NameError Traceback (most recent call last)
in
     17 print("\033[92mAll test passed")
     18
---> 19 compute_cost_test(compute_cost, new_y_train)

NameError: name 'new_y_train' is not defined

1 Like

Hi Tony,

Welcome to the community!

You have probably missed running the cell that assigns new_y_train. What I suggest is: save your work -> go to Kernel -> Restart and Clear Output -> save your work again -> run all the cells from the beginning with Shift+Enter.

When you miss running a cell, its output is not carried forward, and you get an error later on.

Happy Learning!

1 Like

It makes no sense that the importance of "from_logits" is not even mentioned; in fact, categorical cross-entropy is not even mentioned in the lectures.

2 Likes

It is covered in ML Specialization - C2-W2

new_y_test = y_test.map(one_hot_matrix)
new_y_train = y_train.map(one_hot_matrix)

This cell doesn’t run successful, it with errors that result in the error in the compute cost, Any help on this, please

1 Like

After running the cell
new_y_test = y_test.map(one_hot_matrix)
new_y_train = y_train.map(one_hot_matrix)

I get an error. Please help.

ValueError Traceback (most recent call last)
in
----> 1 new_y_test = y_test.map(one_hot_matrix)
      2 new_y_train = y_train.map(one_hot_matrix)

/opt/conda/lib/python3.7/site-packages/tensorflow/python/data/ops/dataset_ops.py in map(self, map_func, num_parallel_calls, deterministic)
   1693 """
   1694 if num_parallel_calls is None:
---> 1695 return MapDataset(self, map_func, preserve_cardinality=True)
   1696 else:
   1697 return ParallelMapDataset(

/opt/conda/lib/python3.7/site-packages/tensorflow/python/data/ops/dataset_ops.py in __init__(self, input_dataset, map_func, use_inter_op_parallelism, preserve_cardinality, use_legacy_function)
   4043 self._transformation_name(),
   4044 dataset=input_dataset,
---> 4045 use_legacy_function=use_legacy_function)
   4046 variant_tensor = gen_dataset_ops.map_dataset(
   4047 input_dataset._variant_tensor,  # pylint: disable=protected-access

/opt/conda/lib/python3.7/site-packages/tensorflow/python/data/ops/dataset_ops.py in __init__(self, func, transformation_name, dataset, input_classes, input_shapes, input_types, input_structure, add_to_graph, use_legacy_function, defun_kwargs)
   3369 with tracking.resource_tracker_scope(resource_tracker):
   3370 # TODO(b/141462134): Switch to using garbage collection.
---> 3371 self._function = wrapper_fn.get_concrete_function()
   3372 if add_to_graph:
   3373 self._function.add_to_graph(ops.get_default_graph())

/opt/conda/lib/python3.7/site-packages/tensorflow/python/eager/function.py in get_concrete_function(self, *args, **kwargs)
   2937 """
   2938 graph_function = self._get_concrete_function_garbage_collected(
---> 2939 *args, **kwargs)
   2940 graph_function._garbage_collector.release()  # pylint: disable=protected-access
   2941 return graph_function

/opt/conda/lib/python3.7/site-packages/tensorflow/python/eager/function.py in _get_concrete_function_garbage_collected(self, *args, **kwargs)
   2904 args, kwargs = None, None
   2905 with self._lock:
---> 2906 graph_function, args, kwargs = self._maybe_define_function(args, kwargs)
   2907 seen_names = set()
   2908 captured = object_identity.ObjectIdentitySet(

/opt/conda/lib/python3.7/site-packages/tensorflow/python/eager/function.py in _maybe_define_function(self, args, kwargs)
   3211
   3212 self._function_cache.missed.add(call_context_key)
---> 3213 graph_function = self._create_graph_function(args, kwargs)
   3214 self._function_cache.primary[cache_key] = graph_function
   3215 return graph_function, args, kwargs

/opt/conda/lib/python3.7/site-packages/tensorflow/python/eager/function.py in _create_graph_function(self, args, kwargs, override_flat_arg_shapes)
   3073 arg_names=arg_names,
   3074 override_flat_arg_shapes=override_flat_arg_shapes,
---> 3075 capture_by_value=self._capture_by_value),
   3076 self._function_attributes,
   3077 function_spec=self.function_spec,

/opt/conda/lib/python3.7/site-packages/tensorflow/python/framework/func_graph.py in func_graph_from_py_func(name, python_func, args, kwargs, signature, func_graph, autograph, autograph_options, add_control_dependencies, arg_names, op_return_value, collections, capture_by_value, override_flat_arg_shapes)
    984 _, original_func = tf_decorator.unwrap(python_func)
    985
--->  986 func_outputs = python_func(*func_args, **func_kwargs)
    987
    988 # invariant: func_outputs contains only Tensors, CompositeTensors,

/opt/conda/lib/python3.7/site-packages/tensorflow/python/data/ops/dataset_ops.py in wrapper_fn(*args)
   3362 attributes=defun_kwargs)
   3363 def wrapper_fn(*args):  # pylint: disable=missing-docstring
---> 3364 ret = _wrapper_helper(*args)
   3365 ret = structure.to_tensor_list(self._output_structure, ret)
   3366 return [ops.convert_to_tensor(t) for t in ret]

/opt/conda/lib/python3.7/site-packages/tensorflow/python/data/ops/dataset_ops.py in _wrapper_helper(*args)
   3297 nested_args = (nested_args,)
   3298
---> 3299 ret = autograph.tf_convert(func, ag_ctx)(*nested_args)
   3300 # If func returns a list of tensors, nest.flatten() and
   3301 # ops.convert_to_tensor() would conspire to attempt to stack

/opt/conda/lib/python3.7/site-packages/tensorflow/python/autograph/impl/api.py in wrapper(*args, **kwargs)
    256 except Exception as e:  # pylint:disable=broad-except
    257 if hasattr(e, 'ag_error_metadata'):
--->  258 raise e.ag_error_metadata.to_exception(e)
    259 else:
    260 raise

ValueError: in user code:

1 Like

(Solution code removed by staff, as sharing it publicly is against the honour code of the community)

1 Like

This worked thank you! I was racking my brain for 30 mins!

2 Likes

But that is not in this course.

1 Like