C4W1 Exercise 3:
I am getting the error below when using tf.nn.log_softmax in the Decoder. The weird part is that all the test cases pass but the implementation check fails.
TypeError: Could not build a TypeSpec for KerasTensor(type_spec=TensorSpec(shape=(64, 15, 12000), dtype=tf.float32, name=None), name='dense/LogSoftmax:0', description="created by layer 'dense'") of unsupported type <class 'keras.src.engine.keras_tensor.KerasTensor'>.
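In case it helps, this TypeError typically appears when tf.nn.log_softmax is called directly on a symbolic KerasTensor. A minimal sketch (shapes here are made up to match the error message; this is not the assignment's exact code) of the pattern that avoids it, passing tf.nn.log_softmax as the Dense layer's activation instead:

```python
import tensorflow as tf

# Hypothetical sketch: apply log_softmax as the Dense activation
# rather than calling tf.nn.log_softmax(dense_output) on a KerasTensor.
inputs = tf.keras.Input(shape=(15, 32))
outputs = tf.keras.layers.Dense(12000, activation=tf.nn.log_softmax)(inputs)
model = tf.keras.Model(inputs, outputs)
```

The activation is applied inside the layer, so no raw op is ever invoked on a KerasTensor from outside the layer graph.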
@Amitabh_G Okay, by way of advice: as memory serves, this last class is suddenly heavily object-oriented, much more so than any of the earlier ones.
So, for example, you want to get used to making self.output_layer-style calls rather than trying to use the Sequential API.
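To illustrate the pattern (names here are illustrative, not the assignment's exact API): layers are created once in __init__ and invoked through self.<attr> inside call, rather than being stacked in a Sequential model.

```python
import tensorflow as tf

# Minimal sketch of the subclassing style, with hypothetical names.
class MyDecoder(tf.keras.layers.Layer):
    def __init__(self, vocab_size, units):
        super().__init__()
        self.embedding = tf.keras.layers.Embedding(vocab_size, units)
        self.output_layer = tf.keras.layers.Dense(
            vocab_size, activation=tf.nn.log_softmax)

    def call(self, target):
        x = self.embedding(target)    # call the stored layer via self
        return self.output_layer(x)   # self.output_layer-style call
```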
Unfortunately I have not set up TensorFlow to work with my local Jupyter notebooks yet (and actually have to figure out how to do that), or I’d run a type test on it for you and at least tell you what it should be.
Personally, I lean towards this being a small error in your code rather than a backend issue. Perhaps a mentor who has these notebooks active can give you more input.
You are supposed to use tf.nn.log_softmax; that is correct.
The issue is not with that. You are probably not passing the right argument somewhere in your code, so either share a complete screenshot of your error or
refer to the link below to find the solution.
Also remember to provide a complete screenshot of the error; if it is lengthy, take two separate screenshots, as that gives information on why you got the error.
Yes, the assignment is written in an object-oriented fashion now, which was not the case before. The class uses the functional API of TensorFlow and doesn’t use the Sequential API. As you suggested, I have taken care to implement the classes properly. I will surely recheck my code again; the best-case scenario is that I find the error in the code and resolve it. Thanks @Nevermnd for your help!
Your code is incorrect. Read the instructions carefully, and remember that each graded cell is assigned arguments; you are supposed to use those arguments.
Another mistake I noticed: when calling the decoder from the previous DecoderLayer, you are not supposed to use DECODER; that is incorrect. Remember, those graded cells come with __init__ and call functions.
So for this cell, when you use the decoder, it should be called as self.decoder.
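As a minimal sketch of that point (the class and layer names here are stand-ins, not the assignment's actual code): the decoder is stored as an attribute in __init__ and then invoked through self.decoder inside call, never through a module-level name like DECODER.

```python
import tensorflow as tf

# Hypothetical wrapper: the sub-layer lives on self, set up in __init__.
class Translator(tf.keras.Model):
    def __init__(self, vocab_size, units):
        super().__init__()
        # A Dense layer stands in for the real Decoder class here.
        self.decoder = tf.keras.layers.Dense(vocab_size)

    def call(self, inputs):
        return self.decoder(inputs)  # self.decoder, not DECODER
```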
But I suspect more mistakes, as I can see you are referencing variables incorrectly (global vs. local): the capitalized VOCAB_SIZE and UNITS are global variables here.
@Deepti_Prasad thanks for your input. I have not modified or changed the arguments in the graded cell. The cell in question is the implementation check cell, so it is frozen. I have noticed the capitalized VOCAB_SIZE and UNITS, and those are not used as arguments in the definitions. @Nevermnd Sure, the number of units in the dense layer is the vocab size.
Get the embedding for the input: you are using the incorrect input. Remember, as mentioned before, you are supposed to write the code using the assigned arguments. So if you check the call section, your input should be target, not the Keras input shape you used for the decoder code.
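A minimal sketch of that last point (argument names are assumptions based on the discussion, not the assignment's exact signature): inside call, the embedding is applied to the target argument that was passed in, not to a newly created Keras Input.

```python
import tensorflow as tf

# Hypothetical call signature: use the assigned `target` argument.
class TargetEmbedder(tf.keras.layers.Layer):
    def __init__(self, vocab_size, units):
        super().__init__()
        self.embedding = tf.keras.layers.Embedding(vocab_size, units)

    def call(self, context, target):
        # Embed the `target` argument, not a tf.keras.Input built here.
        return self.embedding(target)
```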