My attention-weights function passed all the tests before, but I keep getting this error. I also tried using `out = softmax(s)` and `out = activator(s)`.
In Step 2.C, you made two mistakes:
- you hard-coded the value 11;
- you did not use the `output_layer()` that was discussed in the instructions for Exercise 2…
Also, please don’t post your code on the Forum. That breaks the Honor Code.
If a mentor needs to see your code, they’ll ask you to send it via a private message.
Thank you for the help. Can you please explain the difference between the `output_layer` function and what I did with `Dense(…softmax)`?
It’s a matter of variable scope, which is essential in programming.

`output_layer` is a global variable that can be referenced by several components. If it is changed for some reason, the other components do not need to change at all. But if you hard-code the value inside your component, you have to modify it yourself every time, which quickly becomes a big problem.

Also, `output_layer` is NOT a hard-coded Dense layer. It takes `machine_vocab` as a parameter, not a hard-coded value like the one you wrote. And `machine_vocab` comes from the dataset, so its size varies with the dataset. Everything is designed to fit any dataset, whatever its size; hard-coding easily breaks that design.
Very informative, thanks a lot. Basically, what I missed was that my hard-coded Dense layer would be re-instantiated with new parameters for each output, rather than sharing the same parameters across all the outputs.
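For anyone else hitting this: the parameter-sharing point can be sketched in plain NumPy. The `Dense` class below is a hypothetical toy stand-in, not the real Keras API, but it shows the same behavior: a layer's weights are created when the object is instantiated, so reusing one global instance shares parameters, while constructing a new instance at every output step creates fresh, independent parameters each time.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class Dense:
    """Toy stand-in for a dense layer (NOT the real Keras class).

    Weights are created once, at instantiation, and live in the instance.
    """
    def __init__(self, units, input_dim, activation):
        rng = np.random.default_rng()
        self.W = rng.standard_normal((input_dim, units))
        self.b = np.zeros(units)
        self.activation = activation

    def __call__(self, x):
        return self.activation(x @ self.W + self.b)

machine_vocab_size = 11   # in the exercise this comes from the dataset, not a literal
hidden_dim = 64           # assumed hidden size, for illustration only

# Correct pattern: one global layer, created once, reused at every output step.
output_layer = Dense(machine_vocab_size, input_dim=hidden_dim, activation=softmax)

x = np.ones(hidden_dim)
out1 = output_layer(x)    # step 1
out2 = output_layer(x)    # step 2 — same instance, same W and b
assert np.allclose(out1, out2)

# Mistake pattern: instantiating Dense(...) inside the loop gives every output
# step its own randomly initialized, unshared parameters.
bad1 = Dense(machine_vocab_size, input_dim=hidden_dim, activation=softmax)(x)
bad2 = Dense(machine_vocab_size, input_dim=hidden_dim, activation=softmax)(x)
assert not np.allclose(bad1, bad2)
```

In Keras the fix is the same shape: call the one shared `output_layer` inside the loop instead of constructing a new `Dense(...)` there.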