Hi,
I have two problems with the final programming assignment of the second course (Week 3).
First, as I write and run my code, everything goes well until I get to the cell where forward_propagation_test() runs. It takes some time, then the kernel suddenly restarts and a pop-up message says: Kernel Restarting
The kernel appears to have died. It will restart automatically.
Kernel errors can occur when the notebook system runs out of memory, often when users run multiple notebooks at once. Please check the list of running notebooks and shut down any notebooks that you are not using.
I checked the running notebooks and I have nothing else open!
Second, I skipped that test cell (running the function I wrote instead) and tried to move on and compute the total loss. I wrote this function, but the test shows a different number:
total_loss = tf.reduce_sum(tf.keras.losses.categorical_crossentropy(labels, logits, from_logits=True))
Would you please guide me to find my error?
Thanks
Hi @Roozbeh_Ghasemi
This issue can occur due to memory limitations or an infinite loop in the code, so make sure there are no infinite loops that could cause the kernel to hang and eventually restart. Try to write optimized code as instructed in the notebook.
Now for your second problem: since you are using from_logits=True, ensure that logits are the raw outputs of the model (i.e., not passed through a softmax or any other activation function). Also, make sure your model metrics (accuracy, loss, etc.) meet the requirements.
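A minimal sketch of why this matters (the shapes and values below are made up for illustration, not taken from the assignment): passing already-softmaxed probabilities together with from_logits=True applies softmax twice and gives a different, wrong loss.

```python
import tensorflow as tf

# Hypothetical raw model outputs (logits) for 2 examples, 3 classes
logits = tf.constant([[2.0, 1.0, 0.1],
                      [0.5, 2.5, 0.3]])
# One-hot labels: example 0 is class 0, example 1 is class 1
labels = tf.constant([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])

# Correct: pass the raw logits with from_logits=True
loss_raw = tf.keras.losses.categorical_crossentropy(
    labels, logits, from_logits=True)

# Incorrect combination: softmax first AND from_logits=True
# (softmax ends up applied twice, so the loss values differ)
probs = tf.nn.softmax(logits)
loss_double = tf.keras.losses.categorical_crossentropy(
    labels, probs, from_logits=True)

print(loss_raw.numpy())     # per-example losses from raw logits
print(loss_double.numpy())  # different values -- the mismatch to avoid
```

The two printed vectors disagree, which is exactly the kind of silent numeric mismatch the graders catch.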
Hope it helps! Feel free to ask if you need further assistance. If you feel stuck, feel free to share your code in private messages with me!
Hi @Alireza_Saei
Thank you so much for your help. My first problem was solved.
About my second problem, I am modifying the function in the assignment as below:
{moderator edit - solution code removed}
It seems the grader tests the function with predefined data. I didn’t use an activation function before defining my cost function, so I assume the logits are raw. The problem appears before running the function in the model.
Any ideas on what I might be doing wrong?
Additionally, when I run the cell the answer is
Test 1: tf.Tensor(0.17102128, shape=(), dtype=float32)
While the assignment expects:
Test 1: tf.Tensor(0.810287, shape=(), dtype=float32)
When using tf.keras.losses.categorical_crossentropy, ensure that your labels are correctly one-hot encoded. Also, double-check earlier cells for any potential implementation errors that could be causing the issue.
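As a quick sanity check (the integer labels here are illustrative, not from the assignment), tf.one_hot turns integer class indices into the one-hot rows that categorical_crossentropy expects:

```python
import tensorflow as tf

# Hypothetical integer class labels for 4 examples, 3 classes
y = tf.constant([0, 2, 1, 2])

# One-hot encode: result has shape (4, 3), one row per example,
# with a single 1.0 in the column of the true class
labels_one_hot = tf.one_hot(y, depth=3)

print(labels_one_hot.numpy())
```

Each row should sum to exactly 1; if your labels don't look like this, the loss values will be off.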
You need to transpose the labels and logits in order to get the correct answer here. Here’s a thread with the complete “checklist” for this assignment.
Here’s a thread which talks in more detail about why the transpose is required.
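To illustrate the transpose point (the tensor values below are invented for the sketch): if labels and logits arrive shaped (num_classes, num_examples), they must be transposed to (num_examples, num_classes) before calling categorical_crossentropy, since the loss treats the last axis as the class axis.

```python
import tensorflow as tf

# Suppose labels and logits are shaped (num_classes, num_examples),
# e.g. 3 classes x 2 examples (values here are illustrative)
labels = tf.constant([[1.0, 0.0],
                      [0.0, 1.0],
                      [0.0, 0.0]])
logits = tf.constant([[2.0, 0.5],
                      [1.0, 2.5],
                      [0.1, 0.3]])

# categorical_crossentropy expects (num_examples, num_classes),
# so transpose both tensors before computing the loss
total_loss = tf.reduce_sum(
    tf.keras.losses.categorical_crossentropy(
        tf.transpose(labels), tf.transpose(logits), from_logits=True))

print(total_loss.numpy())
```

Without the transposes, the loss is computed across examples instead of across classes, which produces a plausible-looking but wrong number.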
Dear @Alireza_Saei and @paulinpaloalto,
Thank you so much for the insights. After transposing the inputs the problem is solved.
Also, the checklist is a big help.