A = sigmoid(np.dot(w.T,X)+b)
cost = -(np.dot(Y.T,np.log(A))-np.dot((1-Y).T,np.log(1-A)))/m
# YOUR CODE ENDS HERE
# BACKWARD PROPAGATION (TO FIND GRAD)
#(≈ 2 lines of code)
# dw = ...
# db = ...
# YOUR CODE STARTS HERE
dw = np.dot(X,(A-Y).T)/m
db = (np.dot(A,1)-np.dot(Y,1))/m
# YOUR CODE ENDS HERE

First, you also need to use sum for the cost. Second, your implementation of db is wrong: we don’t need dot for db. Check the equations and instructions again.

What I am thinking is: A and Y are vectors, so to compute a1 + a2 + … + am − y1 − y2 − … − ym I first have to add up all the values of the A vector, which can be done by a dot product of A with a ones vector of size m×1, and similarly for Y, and then subtract the two results.
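For what it’s worth, a dot product with a ones vector does compute the same sum — here is a quick sketch comparing the two routes (the A, Y values are made up for illustration, not from the assignment):

```python
import numpy as np

A = np.array([[0.2, 0.7, 0.9]])   # example 1 x m activations (illustrative values)
Y = np.array([[0.0, 1.0, 1.0]])   # example 1 x m labels
m = A.shape[1]

ones = np.ones((m, 1))
via_dot = (np.dot(A, ones) - np.dot(Y, ones)) / m  # sum via dot with a ones vector, shape (1, 1)
via_sum = np.sum(A - Y) / m                        # plain np.sum, a scalar

# Both evaluate to the same number; np.sum is just the simpler way to write it.
```

Note that in the snippet above, np.dot(A, 1) (a dot with the scalar 1, as in the original attempt) would not sum anything — you would need an actual ones vector of shape (m, 1), and even then np.sum is the cleaner choice.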

Please have a look at formula 8 from the implementation instructions again: what we want is the sum of all the differences between a^{(i)} and y^{(i)} for i from 1 to m examples, and then the average of that sum, obtained by dividing it by m.
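Putting the two corrections together (np.sum in the cost, and a plain sum rather than a dot product for db), a minimal sketch of the fixed propagate step might look like the following. The sigmoid helper and the example values for w, b, X, Y are assumptions for illustration, not taken from the thread:

```python
import numpy as np

def sigmoid(z):
    # Standard logistic sigmoid (assumed helper, as in the assignment).
    return 1 / (1 + np.exp(-z))

# Illustrative example: 2 features, m = 3 training examples.
w = np.array([[1.0], [2.0]])                         # weights, shape (n, 1)
b = 2.0                                              # bias
X = np.array([[1.0, 2.0, -1.0], [3.0, 4.0, -3.2]])  # inputs, shape (n, m)
Y = np.array([[1.0, 0.0, 1.0]])                      # labels, shape (1, m)
m = X.shape[1]

# Forward propagation.
A = sigmoid(np.dot(w.T, X) + b)                               # shape (1, m)
cost = -np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A)) / m   # np.sum, per the first correction

# Backward propagation.
dw = np.dot(X, (A - Y).T) / m   # gradient w.r.t. w, shape (n, 1)
db = np.sum(A - Y) / m          # plain sum over the differences, then divide by m
```

The db line is exactly the formula described above: sum the differences a^{(i)} − y^{(i)} over all m examples, then divide by m — no dot product needed.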