Exercise 3 - compute_layer_style_cost

Hi,
In this exercise, in addition to changing the dimensions, we also need to transpose them. I did that, but it didn't work.
What is my problem, and how should I fix it?
a_S = tf.transpose(tf.reshape(a_S, shape=[m, n_H*n_W, n_C]), perm=[0, 1, 2])
a_G = tf.transpose(tf.reshape(a_G, shape=[m, n_H*n_W, n_C]), perm=[0, 1, 2])

InvalidArgumentError Traceback (most recent call last)
in
2 a_S = tf.random.normal([1, 4, 4, 3], mean=1, stddev=4)
3 a_G = tf.random.normal([1, 4, 4, 3], mean=1, stddev=4)
----> 4 J_style_layer_GG = compute_layer_style_cost(a_G, a_G)
5 J_style_layer_SG = compute_layer_style_cost(a_S, a_G)
6

in compute_layer_style_cost(a_S, a_G)
25
26 # Computing gram_matrices for both images S and G (≈2 lines)
---> 27 GS = gram_matrix(a_S)
28 GG = gram_matrix(a_G)
29

in gram_matrix(A)
13 #(≈1 line)
14
---> 15 GA = tf.linalg.matmul(A, tf.transpose(A))
16
17 ### END CODE HERE

/usr/local/lib/python3.6/dist-packages/tensorflow/python/util/dispatch.py in wrapper(*args, **kwargs)
199 """Call target, and fall back on dispatchers if there is a TypeError."""
200 try:
--> 201 return target(*args, **kwargs)
202 except (TypeError, ValueError):
203 # Note: convert_to_eager_tensor currently raises a ValueError, not a

3216 return gen_math_ops.batch_mat_mul_v2(
3218
3219 # Neither matmul nor sparse_matmul support adjoint, so we conjugate

1544 return _result
1545 except _core._NotOkStatusException as e:
-> 1546 _ops.raise_from_not_ok_status(e, name)
1547 except _core._FallbackException:
1548 pass

/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/ops.py in raise_from_not_ok_status(e, name)
6841 message = e.message + (" name: " + name if name is not None else "")
6842 # pylint: disable=protected-access
-> 6843 six.raise_from(core._status_to_exception(e.code, message), None)
6844 # pylint: enable=protected-access
6845

/usr/local/lib/python3.6/dist-packages/six.py in raise_from(value, from_value)

InvalidArgumentError: In[0] mismatch In[1] shape: 1 vs. 3: [16,3,1] [1,3,16] 0 0 [Op:BatchMatMulV2]

I am stuck on the same question. I know the problem is that we’re not reshaping correctly. Look at your error:

InvalidArgumentError: In[0] mismatch In[1] shape: 1 vs. 3: [16,3,1] [1,3,16] 0 0 [Op:BatchMatMulV2]

How can matmul do this calculation? I believe the tensor needs to be unrolled further: it should become [16, 3x1] [3x1, 16], in other words, 2-dimensional. But I can't solve this question either; I've been stuck on it for so long.
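The core idea is that the gram matrix wants an ordinary 2-D matmul, so the 4-D activation has to be unrolled into a matrix first. A minimal sketch of that shape logic, with NumPy standing in for TensorFlow (the variable names are illustrative, not the assignment's graded solution):

```python
import numpy as np

# Toy activation with the same layout as the test: (m, n_H, n_W, n_C)
m, n_H, n_W, n_C = 1, 4, 4, 3
A = np.random.randn(m, n_H, n_W, n_C)

# Unroll to 2-D: after the transpose, each ROW is one channel and
# each COLUMN is one spatial position.
A_unrolled = np.reshape(A, (n_H * n_W, n_C)).T   # shape (n_C, n_H*n_W) = (3, 16)

# The gram matrix is then a plain 2-D matmul:
# (n_C, n_H*n_W) @ (n_H*n_W, n_C) -> (n_C, n_C)
G = A_unrolled @ A_unrolled.T

print(G.shape)  # (3, 3)
```

With a 2-D input there is no batch dimension left, so `matmul` never falls into the `BatchMatMulV2` path that raised the error above.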

Look at this old thread: Course 4, week 4, coding assignment 2, compute_layer_style_cost

You have 3 dimensions in the output of the reshape. That's not what the instructions tell you to do. When in doubt, it never hurts to read the instructions again more carefully. Also note that it is not necessary to use the `perm` argument in this case, but it is critical that you use both reshape and transpose: you can't just reshape directly to the final shape you want, because that ends up scrambling the data. See the link that @crackingthecode gave just above for more explanation.
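To see concretely why the direct reshape scrambles the data, here is a small NumPy demonstration (illustrative only, not the assignment's solution). With a channels-last layout, reshaping to the unrolled shape and then transposing keeps each channel's values together, while reshaping straight to the target shape interleaves channels:

```python
import numpy as np

n_H, n_W, n_C = 2, 2, 3
# Elements 0..11 laid out channels-last: position (0,0) holds 0,1,2; (0,1) holds 3,4,5; ...
A = np.arange(n_H * n_W * n_C).reshape(1, n_H, n_W, n_C)

# Correct: reshape so each row is one spatial position, THEN transpose.
good = A.reshape(n_H * n_W, n_C).T   # row 0 is all of channel 0: [0, 3, 6, 9]

# Wrong: reshape directly to the final (n_C, n_H*n_W) shape.
bad = A.reshape(n_C, n_H * n_W)      # row 0 is [0, 1, 2, 3] - channels mixed together

print(np.array_equal(good, bad))  # False
```

Both arrays have the "right" shape, which is exactly why this bug is easy to miss: the gram matrix still computes, it's just built from scrambled channels.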