I am not able to use tf.Variable()
They give you a pretty big hint about what the code should look like in the instructions for that section. All you have to do is fill in the appropriate “shape” argument, right?
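For reference, here is a minimal sketch of what creating a `tf.Variable` with an explicit shape looks like in general. The initializer and the dimensions below are generic placeholders, not the assignment’s actual layer sizes:

```python
import tensorflow as tf

# Generic illustration only: the initializer and the (rows, cols) shape here
# are placeholders, not the dimensions the assignment asks for.
initializer = tf.keras.initializers.GlorotNormal(seed=1)
W = tf.Variable(initializer(shape=(4, 3)))

print(W.shape)  # (4, 3)
```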
If reading the instructions again doesn’t shed any light, then please actually show us the error trace you are getting. Just saying “I have a problem” doesn’t really give us much to go on … Please bear in mind that the mentors are just fellow students: we do not have the “super power” to look at your notebooks.
I also have a problem with computing the cost, as I get the same error every time.
I went through the discussion (Cannot compute_cost course 2 week 3 - #9 by Damon)
The error I got:
AssertionError: Test does not match. Did you get the mean of your cost functions?
The cost line I have written:
cost = tf.reduce_mean(tf.keras.losses.binary_crossentropy(tf.transpose(labels), tf.transpose(logits), from_logits=False))
Ok, but as I mentioned before, just saying “my answer comes out wrong” doesn’t really give us anything to go on. We can’t see your notebooks. Please show us the output you are getting. How do you know it’s wrong? Just “copy/paste” the error output you are seeing.
Also note that the thread you quote is an old thread. In one of my replies there, I added an “Update” to mention that the course staff made significant revisions to the assignment to fix a bunch of the problems that are discussed on that thread.
Ok, your mistake is using binary cross entropy. This is a multiclass problem. That’s why that previous thread is not a good one to read. Please read the actual instructions in the notebook. They specifically tell you which cost function to use and it’s not binary cross entropy, right?
Your value of the from_logits parameter is also incorrect. As you can see from how we wrote forward propagation here, the values being passed to the loss are logits, right? That’s as opposed to activation outputs.
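To make the distinction concrete, here is a rough sketch of that general pattern: categorical (not binary) cross entropy, called on transposed labels and logits, with from_logits set to reflect that the inputs are raw logits. The shapes and the reduction here are illustrative, not necessarily the assignment’s exact expected code:

```python
import tensorflow as tf

# Rough sketch only: toy shapes, and the final reduction may not match
# exactly what the assignment's test expects.
logits = tf.random.normal((6, 3))                 # (classes, examples), raw logits
labels = tf.one_hot([0, 3, 5], depth=6, axis=0)   # same (classes, examples) layout

cost = tf.reduce_mean(
    tf.keras.losses.categorical_crossentropy(
        tf.transpose(labels),   # y_true: one example per row
        tf.transpose(logits),   # y_pred: raw logits, so ...
        from_logits=True        # ... tell the loss they have not been through softmax
    )
)
print(cost)
```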