Getting an error in the C3W3 assignment of NLP Course 3

I am getting an error in NLP Course 3, Assignment 3 (Question Duplicates).
I passed the first two exercises:

  1. Siamese
  2. TripletLossFn

Now, in Exercise 3, the train_model function, I have filled in the missing values according to the instructions and checked them hundreds of times, but I am still getting the error below:

Node: ‘SiameseModel/Cast’
2 root error(s) found.
(0) UNIMPLEMENTED: Cast string to int32 is not supported
[[{{node SiameseModel/Cast}}]]
(1) CANCELLED: Function was cancelled before it was started
0 successful operations.
0 derived errors ignored. [Op:__inference_train_function_27910]

Has anyone here faced the same issue, or can anyone guide me on it?

Hi @Muhammad_Hamza9

The error you’re getting might be caused by the way you defined input1 and input2 in the Exercise 01 Siamese(...) function. Note that both inputs in our case are of dtype tf.string.
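For reference, here is a minimal sketch (not the assignment's full Siamese() implementation; the shape and the layer names are just illustrative) of string inputs declared this way:

import tensorflow as tf

# Both inputs carry raw question text, so they are declared as tf.string;
# a numeric or default dtype here would make the model try to cast the
# string batches, which matches the Cast error reported above.
input1 = tf.keras.layers.Input(shape=(1,), dtype=tf.string, name="input_1")
input2 = tf.keras.layers.Input(shape=(1,), dtype=tf.string, name="input_2")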

Yes, this is the reason. Thank you so much!

arvy, I used tf.string for input1 and input2, yet got a similar error:

Node: ‘SiameseModel/Cast’
2 root error(s) found.
(0) UNIMPLEMENTED: Cast string to int64 is not supported
[[{{node SiameseModel/Cast}}]]
(1) CANCELLED: Function was cancelled before it was started
0 successful operations.
0 derived errors ignored. [Op:__inference_train_function_33073]

For future readers: for dtype=, do not use 'tf.string' (the dtype name wrapped in quotes). Use either of the following (see the sketch after this list):

  • 'string' (or "string"), or
  • tf.string (without quotes or apostrophes)
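As a quick illustration (the Input call and its shape are only an example, not the assignment's exact code):

import tensorflow as tf

ok_a = tf.keras.layers.Input(shape=(1,), dtype="string")    # dtype given as the name string
ok_b = tf.keras.layers.Input(shape=(1,), dtype=tf.string)   # dtype given as the tf.DType object
# bad = tf.keras.layers.Input(shape=(1,), dtype='tf.string')  # the quoted 'tf.string' is what this thread warns against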

Cheers

Hi

{moderator edit - solution code removed}

This is my function, and it’s not passing the tests. I don’t even get the right result when executing:

v1 = np.array([[0.26726124, 0.53452248, 0.80178373],[0.5178918 , 0.57543534, 0.63297887]])
v2 = np.array([[ 0.26726124, 0.53452248, 0.80178373],[-0.5178918 , -0.57543534, -0.63297887]])
print("Triplet Loss:", TripletLossFn(v1,v2).numpy())

And I don’t understand why, since I’m doing everything that is requested :slight_smile:

The comment there is pretty specific and what you did does not agree with what it says. That will not end well.