Unable to pass Siamese Network test

GRADED FUNCTION: Siamese

import tensorflow as tf

def Siamese(text_vectorizer, vocab_size=36224, d_feature=128):
    """Returns a Siamese model.

    Args:
        text_vectorizer (TextVectorization): TextVectorization instance, already adapted to your training data.
        vocab_size (int, optional): Length of the vocabulary. Defaults to 36224, which is the vocabulary size for your case.
        d_feature (int, optional): Depth of the model. Defaults to 128.

    Returns:
        tf.keras.models.Model: A Siamese model.
    """
    ### START CODE HERE ###

    # mentor edit: code removed

    ### END CODE HERE ###

    return tf.keras.models.Model(inputs=[input1, input2], outputs=conc, name="SiameseModel")

Check your model:

model = Siamese(text_vectorization, vocab_size=text_vectorization.vocabulary_size())
model.build(input_shape=None)
model.summary()
model.get_layer(name='sequential').summary()

Here is my output, which I believe matches the expected one:

Model: "SiameseModel"


Layer (type) Output Shape Param # Connected to

input_1 (InputLayer) [(None, 1)] 0

sequential (Sequential) (None, 128) 4768256 ['input_1[0][0]',
'input_1[0][0]']

input_2 (InputLayer) [(None, 1)] 0

conc_1_2 (Concatenate) (None, 256) 0 ['sequential[0][0]',
'sequential[1][0]']

==================================================================================================
Total params: 4768256 (18.19 MB)
Trainable params: 4768256 (18.19 MB)
Non-trainable params: 0 (0.00 Byte)


Model: "sequential"


Layer (type) Output Shape Param #

text_vectorization (TextVe (None, None) 0
ctorization)

embedding (Embedding) (None, None, 128) 4636672

LSTM (LSTM) (None, None, 128) 131584

mean (GlobalAveragePooling (None, 128) 0
1D)

out (Lambda) (None, 128) 0

=================================================================
Total params: 4768256 (18.19 MB)
Trainable params: 4768256 (18.19 MB)
Non-trainable params: 0 (0.00 Byte)


Expected output:


Layer (type) Output Shape Param # Connected to

input_1 (InputLayer) [(None, 1)] 0

input_2 (InputLayer) [(None, 1)] 0

sequential (Sequential) (None, 128) 4768256 ['input_1[0][0]',
'input_2[0][0]']

conc_1_2 (Concatenate) (None, 256) 0 ['sequential[0][0]',
'sequential[1][0]']

==================================================================================================
Total params: 4768256 (18.19 MB)
Trainable params: 4768256 (18.19 MB)
Non-trainable params: 0 (0.00 Byte)


Model: "sequential"


Layer (type) Output Shape Param #

text_vectorization (TextVe (None, None) 0
ctorization)

embedding (Embedding) (None, None, 128) 4636672

LSTM (LSTM) (None, None, 128) 131584

mean (GlobalAveragePooling (None, 128) 0
1D)

out (Lambda) (None, 128) 0

=================================================================
Total params: 4768256 (18.19 MB)
Trainable params: 4768256 (18.19 MB)
Non-trainable params: 0 (0.00 Byte)

But the grader is not passing all the tests.

Test your function!

w3_unittest.test_Siamese(Siamese)
Layer 'sequential' has an incorrect type.
Expected:<class 'keras.src.engine.input_layer.InputLayer'>,
Got:<class 'keras.src.engine.sequential.Sequential'>.

Layer 'input_2' has an incorrect type.
Expected:<class 'keras.src.engine.sequential.Sequential'>,
Got:<class 'keras.src.engine.input_layer.InputLayer'>.

Layer ‘sequential’ has an incorrect input shape.
Expected:[(None, 1)],
Got:(None,).

Layer ‘sequential’ has an incorrect output shape.
Expected:[(None, 1)],
Got:(None, 128).

Layer ‘input_2’ has an incorrect input shape.
Expected:(None,),
Got:[(None, 1)].

Layer ‘input_2’ has an incorrect output shape.
Expected:(None, 128),
Got:[(None, 1)].

Layer ‘sequential’ has an incorrect input shape.
Expected:[(None, 1)],
Got:(None,).

Layer ‘sequential’ has an incorrect output shape.
Expected:[(None, 1)],
Got:(None, 16).

Layer ‘input_2’ has an incorrect input shape.
Expected:(None,),
Got:[(None, 1)].

Layer ‘input_2’ has an incorrect output shape.
Expected:(None, 16),
Got:[(None, 1)].

15 tests passed
10 tests failed

Please help me find the solution.


Hello @Vishwam_Gupta

I think this is a simple oversight: you forgot to specify the dtype for both of the inputs.
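To make the dtype hint concrete, here is a minimal sketch, not the official assignment solution; the helper name `make_siamese_sketch` and the toy layer sizes are invented for illustration. The key detail is declaring both inputs with `dtype="string"` so that the shared branch's `TextVectorization` layer can consume raw text:

```python
import tensorflow as tf

def make_siamese_sketch(text_vectorizer, vocab_size=100, d_feature=16):
    # Shared branch applied to both inputs: vectorize text, embed, run an
    # LSTM, average over time, then L2-normalize the resulting vector.
    branch = tf.keras.models.Sequential([
        text_vectorizer,
        tf.keras.layers.Embedding(vocab_size, d_feature),
        tf.keras.layers.LSTM(d_feature, return_sequences=True),
        tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Lambda(lambda x: tf.math.l2_normalize(x, axis=1)),
    ], name="sequential")

    # dtype="string" is the detail the hint refers to: without it, the
    # inputs default to float and TextVectorization cannot consume them.
    input1 = tf.keras.layers.Input((1,), dtype="string", name="input_1")
    input2 = tf.keras.layers.Input((1,), dtype="string", name="input_2")

    conc = tf.keras.layers.Concatenate(axis=1, name="conc_1_2")(
        [branch(input1), branch(input2)])
    return tf.keras.models.Model(inputs=[input1, input2], outputs=conc)
```

Calling the model on a pair of string tensors then yields one concatenated vector of length `2 * d_feature` per example.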

Also, Vishwam, the updated Course 3 now has only 3 weeks, so the assignment you are having an issue with now comes in Week 3 of Course 3.
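A further guess, based purely on the test output posted above: the grader appears to expect the layers in the order input_1, input_2, sequential, whereas the posted summary shows input_1, sequential, input_2. In the functional API the layer list follows the order in which layers are created and connected, so defining both Input layers before calling the shared branch should reproduce the expected order. A toy sketch (a Dense layer stands in for the real branch):

```python
import tensorflow as tf

# Both Input layers are created BEFORE the shared branch is called, so
# "input_2" should appear before "sequential" in model.layers, matching
# the ordering the grader seems to check.
inp1 = tf.keras.layers.Input((1,), name="input_1")
inp2 = tf.keras.layers.Input((1,), name="input_2")
shared = tf.keras.models.Sequential([tf.keras.layers.Dense(4)],
                                    name="sequential")
conc = tf.keras.layers.Concatenate(name="conc_1_2")(
    [shared(inp1), shared(inp2)])
model = tf.keras.models.Model(inputs=[inp1, inp2], outputs=conc)
print([layer.name for layer in model.layers])
```

If `input_2` shows up after `sequential` in your own summary, moving the second `Input` definition above the first call to the branch is worth trying.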

Regards
DP
