Hi,
At the end of the forward propagation implementation, in the blue "What you should remember" note, it says:
" A convolution extracts features from an input image by taking the dot product between the input data and a 3D array of weights (the filter)."
My understanding from the course is that it is not the dot product, but rather the sum of element-wise multiplication. Am I mistaken?
Thanks!
Hello @1492r,
Your understanding is correct! That said, the lab's description is not wrong either.
The most common understanding of a dot product is the one between two vectors, and that dot product is exactly the sum of element-wise multiplications, as you said.
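For example, with two plain vectors in numpy, the dot product and the sum of element-wise multiplications give the same number (a quick illustration, not from the lab):

import numpy as np

a = np.array([1., 2., 3.])
b = np.array([4., 5., 6.])

# dot product of the two vectors
np.dot(a, b)     # 32.0

# the same value written as the sum of element-wise multiplications
np.sum(a * b)    # 32.0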
Now we know that the filter and the image are both not vectors, so what is the dot product between two non-vectors, or between two tensors? I think the key here is to expand the rule of “sum of element-wise multiplication” to tensors.
Below is an example of using a numpy function called tensordot to do a "tensor-dot" operation on two tensors that are both not vectors. The result of the tensordot is 13, and you can easily verify it by doing the "sum of element-wise multiplication" by hand.
import numpy as np

# a 2x2 "filter"
filters = np.array([
    [1., 2.],
    [2., 3.],
])

# a 2x2 image patch of the same shape
image = np.array([
    [1., 2.],
    [1., 2.],
])

# multiply matching elements and sum them all -> a scalar
np.tensordot(filters, image)  # 13.0
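To make that verification concrete, here is the same check written out explicitly (just a sanity check, not code from the lab):

# element-wise multiplication, then sum everything -> the same 13.0
np.sum(filters * image)  # 13.0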
Cheers,
Raymond
Hi, Raymond,
Thank you for the quick and detailed reply!
It’s been a very long time since I learned this part of math in school, but my impression is that the dot product of two matrices should be a matrix. In your example it should be:
[(1X1+1X2), (1X2+2X2)],
[(2X1+3X1), (2X2+3X2)]
Correct me if I am wrong on this. If I am not, are you suggesting that a different definition is applied in deep learning?
Thanks!
Hello @1492r,
I think you are talking about matrix multiplication, which multiplies two matrices to produce a third matrix.
Matrix multiplication is different from the dot product, and that is true in general mathematics too; deep learning does not introduce any special cases or different definitions for either of them.
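For instance, with the same two arrays as before, the two operations give very different results (again, just an illustrative numpy snippet, not code from the lab):

import numpy as np

filters = np.array([[1., 2.],
                    [2., 3.]])
image = np.array([[1., 2.],
                  [1., 2.]])

# matrix multiplication: each row of `filters` is dotted with each column of
# `image`, producing another 2x2 matrix
filters @ image                 # [[ 3.,  6.], [ 5., 10.]]

# tensor-dot: multiply matching elements and sum them all, producing a scalar
np.tensordot(filters, image)    # 13.0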
Cheers,
Raymond
Aha. Good to have this confirmation!
Thank you, Raymond!
You are welcome, @1492r!