Question about Lab 1 data type

When running the Lab 1 code below, I seem to be getting results of a different data type than the expected results. What is the difference between ndarray<tf.Tensor> and tf.Tensor?

buckets tf.Tensor(
[[3 3 3 3 3 3 3 3]
 [3 3 3 3 3 3 3 3]
 [3 3 3 3 3 3 3 3]], shape=(3, 8), dtype=int32)

compared to

buckets ndarray<tf.Tensor(
[[3 3 3 3 3 3 3 3]
 [3 3 3 3 3 3 3 3]
 [3 3 3 3 3 3 3 3]], shape=(3, 8), dtype=int32)>
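
For reference, here is a minimal way to inspect which class the backend actually hands back (just a sketch, assuming plain NumPy and trax's fastmath are importable, as in the lab code below):

import numpy as np
from trax import fastmath

with fastmath.use_backend("tensorflow-numpy"):
    x = fastmath.numpy.ones((3, 8))  # any array produced under this backend
    print(type(x))                   # the concrete class behind the repr shown above
    print(np.asarray(x))             # same values, unwrapped to a plain NumPy array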
import numpy as np
from trax import fastmath

def our_hash_vectors(vecs, rng, n_buckets, n_hashes, mask=None, verbose=False):
    # moderator edit: code removed
    return buckets

ohv_q = np.ones((8, 5))  # (seq_len=8, n_q=5)
ohv_n_buckets = 4  # even number
ohv_n_hashes = 3
with fastmath.use_backend("tensorflow-numpy"):
    ohv_rng = fastmath.random.get_prng(1)
    ohv = our_hash_vectors(
        ohv_q, ohv_rng, ohv_n_buckets, ohv_n_hashes, mask=None, verbose=True
    )
    print("ohv shape", ohv.shape, "\nohv", ohv)  # (ohv_n_hashes * ohv_n_buckets)

Note that the random number generators do not produce the same results with different backends.

with fastmath.use_backend("jax"):
    ohv_rng = fastmath.random.get_prng(1)
    ohv = our_hash_vectors(ohv_q, ohv_rng, ohv_n_buckets, ohv_n_hashes, mask=None)
    print("ohv shape", ohv.shape, "\nohv", ohv)  # (ohv_n_hashes * ohv_n_buckets)

random.rotations.shape (5, 3, 2)
random_rotations reshaped (5, 6)
rotated_vecs1 (8, 6)
rotated_vecs2 (8, 3, 2)
rotated_vecs3 (3, 8, 2)
rotated_vecs.shape (3, 8, 4)
buckets.shape (3, 8)
buckets tf.Tensor(
[[3 3 3 3 3 3 3 3]
 [3 3 3 3 3 3 3 3]
 [3 3 3 3 3 3 3 3]], shape=(3, 8), dtype=int32)
buckets with offsets (1, 24) tf.Tensor([[ 3  3  3  3  3  3  3  3  7  7  7  7  7  7  7  7 11 11 11 11 11 11 11 11]], shape=(1, 24), dtype=int32)
ohv shape (1, 24)
ohv tf.Tensor([[ 3  3  3  3  3  3  3  3  7  7  7  7  7  7  7  7 11 11 11 11 11 11 11 11]], shape=(1, 24), dtype=int32)
using jax
ohv shape (1, 24)
ohv [[ 3  3  3  3  3  3  3  3  5  5  5  5  5  5  5  5 11 11 11 11 11 11 11 11]]

Please do not post your code on the forum. That’s not allowed by the Code of Conduct.

Thank you.

Hi @PZ2004

Your question's formatting is hard to read. The difference (from the “Expected output”) is:

ohv shape (24,)
ohv ndarray<tf.Tensor([ 3  3  3  3  3  3  3  3  7  7  7  7  7  7  7  7 11 11 11 11 11 11 11 11], shape=(24,), dtype=int32)>

versus using jax:

ohv shape (24,)
ohv [ 3  3  3  3  3  3  3  3  5  5  5  5  5  5  5  5 11 11 11 11 11 11 11 11]

Note the difference 5 vs. 7.
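
If you want to confirm that the mismatch comes from the backend-specific random number generators (as you already noted) rather than from your code, a minimal check (a sketch, using only the fastmath calls already in your snippet) is to print the PRNG key itself under each backend:

from trax import fastmath

for backend in ("tensorflow-numpy", "jax"):
    with fastmath.use_backend(backend):
        rng = fastmath.random.get_prng(1)
        # The key object (and anything drawn from it) is backend-specific,
        # so the bucket ids are not expected to match across backends.
        print(backend, type(rng), rng)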

Does that answer the question?