Can anyone provide the complete Week 1 Assignment 1 code (a full implementation of backpropagation using NumPy)?

I want to implement a CNN using only NumPy, so can anyone provide the complete code for C4 W1 A1 (connecting forward and backward propagation)?

Sorry, but I am not aware of any such code being available here. In the C4 W1 A1 assignment, they lead you through the individual functions to do backprop for conv layers and pooling layers, but they don’t give you the structure to tie it all together. Note that most ConvNets have FC layers at the end, and you also have activation functions that must be included in the Chain Rule calculations. You could start with the complete implementation we have in C1 W4 A2 for the Fully Connected case, including the activation functions used there, and then extend it with the additional functions from the C4 W1 A1 assignment.
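To make the wiring concrete, here is a minimal hedged sketch (not the assignment's solution code) of the caching pattern that ties forward and backward passes together. Only a dense layer and ReLU are shown; the function names `dense_forward`, `dense_backward`, `relu_forward`, and `relu_backward` are my own illustrative choices, but the assignment's `conv_forward`/`conv_backward` and `pool_forward`/`pool_backward` slot into exactly the same chain:

```python
import numpy as np

def dense_forward(A_prev, W, b):
    # Linear step Z = W A_prev + b; cache inputs needed for backprop
    Z = W @ A_prev + b
    return Z, (A_prev, W)

def dense_backward(dZ, cache):
    A_prev, W = cache
    m = A_prev.shape[1]                       # number of examples
    dW = (dZ @ A_prev.T) / m                  # gradient w.r.t. weights
    db = dZ.sum(axis=1, keepdims=True) / m    # gradient w.r.t. bias
    dA_prev = W.T @ dZ                        # pass gradient to previous layer
    return dA_prev, dW, db

def relu_forward(Z):
    return np.maximum(0, Z), Z                # cache Z for the backward pass

def relu_backward(dA, Z):
    # Chain rule through the activation: dZ = dA * g'(Z)
    return dA * (Z > 0)

# Tiny end-to-end check: dense -> relu -> dense
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 5))               # 4 features, 5 examples
W1, b1 = rng.standard_normal((3, 4)), np.zeros((3, 1))
W2, b2 = rng.standard_normal((1, 3)), np.zeros((1, 1))

Z1, c1 = dense_forward(X, W1, b1)
A1, cZ1 = relu_forward(Z1)
Z2, c2 = dense_forward(A1, W2, b2)

dZ2 = Z2 - rng.standard_normal((1, 5))        # stand-in for a loss gradient
dA1, dW2, db2 = dense_backward(dZ2, c2)
dZ1 = relu_backward(dA1, cZ1)
dX, dW1, db1 = dense_backward(dZ1, c1)
```

The key design point is that every forward function returns a cache, and every backward function consumes that cache and returns `dA_prev` for the layer before it; that is the "structure to tie it all together" you would have to write yourself for the conv and pooling layers.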

The high-level point here is that there is a reason why we switch to using TensorFlow for implementation right at this juncture: life just gets too complicated. We will also soon need to handle even more functions, such as Batch Normalization, dropout regularization, skip connections for residual networks, and transposed convolutions. Nobody doing “real world” problem solving in this space does the whole job in NumPy by hand; it’s just too much work. Of course, in the “real world” you also have other choices besides TensorFlow. Many people also like PyTorch as a framework, but we don’t cover that here in DLS.
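To illustrate how much the framework saves you, here is a hedged sketch (assuming TensorFlow 2.x with Keras; the layer sizes are arbitrary, not the assignment's) of a conv -> pool -> FC stack where all of the backprop above is handled automatically:

```python
import tensorflow as tf

# Arbitrary illustrative architecture: one conv layer, one pooling layer,
# then a fully connected softmax head. Gradients for every layer are
# derived automatically; no hand-written backward functions needed.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 4, padding="same", activation="relu",
                           input_shape=(64, 64, 3)),
    tf.keras.layers.MaxPool2D(8),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(6, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy")
```

A single `model.fit(X, Y)` call then runs forward prop, backprop, and the optimizer update for every layer, which is the whole point of moving off hand-rolled NumPy at this stage of the course.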