# UNGRADED LAB: Scaled Dot-Product Attention doubt

The snippets below are from an ungraded lab, so I am sharing them on this public thread purely for learning purposes, not to hand solutions to other learners.

I am encountering a type error while working on an ungraded lab. It comes from the test cell that visualizes the weights to check for alignment:

```python
ax.imshow(alignment, cmap='gray')
```

How do I resolve this issue, @arvyzukai?

Regards
DP

There is a “Solutions” section at the bottom of the ungraded lab. Have you missed it by chance, or do I misunderstand the issue?

Cheers


Actually, I didn’t know the solutions were given at the bottom. However, even when I run with the solution, I still encounter the same error.

This is the error I am getting:

```
ValueError                                Traceback (most recent call last)
Cell In[9], line 9
      5     weights = calculate_weights(queries, keys)
      6     return np.matmul(weights, values)
----> 9 attention_qkv_result = attention_qkv(embedded_fr, embedded_en, embedded_en)
     11 print(f"The shape of the attention_qkv function is {attention_qkv_result.shape}")
     12 print(f"Some elements of the attention_qkv function are \n{attention_qkv_result[0:2,:10]}")

Cell In[9], line 6, in attention_qkv(queries, keys, values)
      4     # Replace pass with your code.
      5     weights = calculate_weights(queries, keys)
----> 6 return np.matmul(weights, values)

ValueError: matmul: Input operand 0 does not have enough dimensions (has 0, gufunc core with signature (n?,k),(k,m?)->(n?,m?) requires 1)
```

You might have forgotten to `return weights` in `calculate_weights` — that is the most probable cause. Otherwise you should look at other causes.
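If `calculate_weights` falls through without a `return`, Python returns `None`, and `np.matmul` converts `None` into a 0-dimensional array, which produces exactly this `ValueError`. A minimal reproduction:

```python
import numpy as np

values = np.ones((3, 2))
weights = None  # what a helper returns when its `return` statement is missing

try:
    # np.matmul treats None as a 0-d array, which has too few dimensions
    np.matmul(weights, values)
except ValueError as e:
    print(e)  # matmul: Input operand 0 does not have enough dimensions ...
```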

If the first exercise fails, then the second should too
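For completeness, here is a sketch of what the two exercises compute together, assuming `calculate_weights` is plain scaled dot-product attention (softmax of QKᵀ/√d) over 2-D embedding matrices. The toy shapes below are illustrative, not the lab's actual embeddings:

```python
import numpy as np

def calculate_weights(queries, keys):
    # Scaled dot-product scores: Q @ K^T / sqrt(d_k)
    d_k = queries.shape[-1]
    scores = np.matmul(queries, keys.T) / np.sqrt(d_k)
    # Numerically stable softmax over the key dimension
    exp_scores = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = exp_scores / exp_scores.sum(axis=-1, keepdims=True)
    return weights  # omitting this line returns None and triggers the ValueError

def attention_qkv(queries, keys, values):
    weights = calculate_weights(queries, keys)
    return np.matmul(weights, values)

# Toy stand-ins for embedded_fr / embedded_en: 4 and 5 tokens, 8-dim embeddings
rng = np.random.default_rng(0)
embedded_fr = rng.normal(size=(4, 8))
embedded_en = rng.normal(size=(5, 8))

result = attention_qkv(embedded_fr, embedded_en, embedded_en)
print(result.shape)  # one attended 8-dim vector per French query token
```

Each row of `weights` is a probability distribution over the English tokens, so the rows sum to 1 and the output has one row per query.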

That was the issue. Sorry again for troubling you over silly mistakes.


No problem, Deepti. Happy to help, as always.
