Extra weights confusion

In the Week 2 video "Gradient Descent on m Examples", the professor says that the number of weights increases as the number of features increases for our problem.

  1. For this particular logistic regression problem of classifying cats, why are only two weights considered?

  2. Also, can we build our training set as initial and final weights for the intensities of each image, and update them accordingly?

  3. Please correct me if I'm wrong: are x1, x2, …, xn here the intensities of one single image, with corresponding weights w1, w2, …, wn? That would give the equation z = w1x1 + w2x2 + … + wnxn + b. If so, why are only two weights considered for our algorithm, and not 12288 as per the initial derivation in the first video on binary classification?

Please help me clear up my doubts; several points confused me after watching the video on gradient descent over m examples.

Thanks and Regards,

Prof Ng gives some examples in the lectures with only 2 or 3 features, just for simplicity in writing out the formulas. But everything we are doing here is completely general and can handle any number of input "features". The number of weights is exactly the same as the number of features. In the actual dataset we use for the exercise, the input images are 64 x 64 x 3 RGB images, so the number of features is 64 * 64 * 3 = 12288.
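To make the "one weight per feature" point concrete, here is a minimal sketch in NumPy (the shapes are illustrative, not the course's actual data): a single flattened 64 x 64 x 3 image gives 12288 features, so z = w1x1 + … + wnxn + b is just a dot product over 12288 weights.

```python
import numpy as np

# One 64 x 64 x 3 RGB image flattened into a column feature vector
n_features = 64 * 64 * 3          # 12288 features -> 12288 weights
x = np.random.rand(n_features, 1) # stand-in pixel intensities in [0, 1)
w = np.zeros((n_features, 1))     # one weight per feature (zero-initialized)
b = 0.0

# z = w1*x1 + w2*x2 + ... + wn*xn + b, vectorized as a dot product
z = float(np.dot(w.T, x)) + b
print(n_features, z)              # 12288 0.0 (all weights are still zero)
```

With 2 or 3 features the same formula is just short enough to write out by hand, which is all the lecture examples are doing.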

I don’t understand question 2). The training set is the images. You can choose the image size you want to use: image processing libraries all support “downsampling” images to any size, but the point is that our models require all inputs to have the same number of features and be of the same type (RGB, CMYK, greyscale, …). For Logistic Regression, we initialize the weight and bias values to zero. Next week, when we get to real Neural Networks, that will no longer work and we will need random initialization to provide “Symmetry Breaking”. Here’s a thread which discusses that in detail.
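As a sketch of those two points together (the shapes and the toy "dataset" below are assumptions for illustration, not the course's actual helper code): every image is flattened to the same 12288-element feature vector, stacked one column per example, and the weights and bias start at zero.

```python
import numpy as np

def initialize_with_zeros(dim):
    # Zero initialization is fine for logistic regression: each weight
    # still gets a different gradient because it sees a different feature.
    w = np.zeros((dim, 1))
    b = 0.0
    return w, b

# Toy stand-in for the training images: m examples, each 64 x 64 x 3 RGB,
# already downsampled to the same size so every example has 12288 features.
m = 5
images = np.random.rand(m, 64, 64, 3)
X = images.reshape(m, -1).T       # shape (12288, m): one column per example

w, b = initialize_with_zeros(X.shape[0])
print(X.shape, w.shape, b)        # (12288, 5) (12288, 1) 0.0
```

The reshape-then-transpose step is why the input size must be fixed up front: a differently sized image would produce a column of the wrong length and could not share the same weight vector.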

Question 3 is answered by the first paragraph above.


Thank you, Paul, for the explanation. The second question came out of confusion before I had watched the videos in the second part of Week 2. Now I have completed the assignment and the videos, and my major doubts are cleared up.